Talk:Deepfake pornography
This article is rated C-class on Wikipedia's content assessment scale. It is of interest to the following WikiProjects:
DeepNude
Currently, the DeepNude page redirects here. It should get its own page, though, since it seems a bit harsh to imply that this "mixed reality" software can only be used in a bad way. I may be overthinking this, but, as with regular deepfakes, the algorithm could potentially be used in the film industry, and perhaps to address psychological and mental problems (anxieties such as stage fright, frigidity, ...). Some safeguards would probably need to be integrated to prevent misuse of the software. --Genetics4good (talk) 11:26, 15 November 2020 (UTC)
Deepfake CSAM section wording changes
I reworded one sentence in the Deepfake CSAM section to remove NPOV issues, as it came across as advocating a position in the Wikipedia voice (WP is not an advocacy or PSA site). I also made it clear we were referring to deepfake CSAM, not real CSAM (as opposed to fake/computer-generated/simulated CSAM). I removed the part of the statement about deepfake CSAM involving privacy violations of children because it's not clear that deepfake CSAM meets the usual definition of a privacy violation. Since deepfakes don't use the subject/victim's actual nude parts, but instead use either nude parts from other real (i.e., non-fake) CSAM, computer-generated nudity, or nude bodies/body parts from adult nudes, whether they meet the definition of a privacy violation in the usual sense is debatable. The nude parts of deepfake CSAM don't involve the actual child whose head is being used in the deepfake, and even if the nude parts involve the actually nude body of a child, it isn't a direct violation of the subject/victim's privacy, since it's not actually them being depicted nude. A privacy violation might be an issue if a real child's nude parts were used in an identifiable way to make the deepfake, but I doubt most deepfake CSAM incorporating actual child nudity would allow for any identifiability beyond the head being used. Real (as in non-fake) CSAM clearly invades privacy because the child is actually identifiable in the photo or video. If someone believes that all or any deepfake CSAM still involves a privacy violation of a child of some sort, and wants to add that claim back in, then I'd recommend finding a source that argues this specifically and citing it.
Do not simply assume that the privacy issue as previously applied to real CSAM also applies to fake/deepfake CSAM, since that would violate WP's Synthesis policy, given that the currently listed source doesn't support that conclusion. One last thing: the final sentence, about how deepfake CSAM hinders law enforcement, has no citation. While I don't doubt that it does in some ways, it still should be cited, so I'll leave it in for now; but unless it receives a citation within a reasonable time frame, it will have to be removed, per WP citation policy. --Notcharliechaplin (talk) 17:07, 11 August 2021 (UTC)
Wiki Education assignment: History of Sexuality
This article was the subject of a Wiki Education Foundation-supported course assignment, between 10 January 2023 and 19 April 2023. Further details are available on the course page. Student editor(s): Rgxo (article contribs).
— Assignment last updated by Rgxo (talk) 16:50, 20 April 2023 (UTC)
Wiki Education assignment: Seminar in Human Sexuality
This article was the subject of a Wiki Education Foundation-supported course assignment, between 21 August 2023 and 4 December 2023. Further details are available on the course page. Student editor(s): Privacyan (article contribs).
— Assignment last updated by Zy175311460 (talk) 23:21, 3 April 2024 (UTC)